Information entropy, information distances, and complexity in atoms
Abstract
Shannon information entropies in position and momentum spaces and their sum are calculated as functions of Z (2 ≤ Z ≤ 54) in atoms. Roothaan-Hartree-Fock electron wave functions are used. The universal property S = a + b ln Z is verified. In addition, we calculate the Kullback-Leibler relative entropy, the Jensen-Shannon divergence, Onicescu's information energy, and a complexity measure...
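For illustration, here is a minimal Python sketch of the information measures the abstract names, applied to a discrete probability distribution. The paper itself works with continuous electron densities in position and momentum space, so this is a toy analogue under that assumption, not the authors' method:

    import numpy as np

    def shannon_entropy(p):
        # H(p) = -sum_i p_i ln p_i, in nats; terms with p_i = 0 contribute 0
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def kullback_leibler(p, q):
        # Relative entropy D(p||q) = sum_i p_i ln(p_i / q_i);
        # assumes q_i > 0 wherever p_i > 0
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        mask = p > 0
        return np.sum(p[mask] * np.log(p[mask] / q[mask]))

    def jensen_shannon(p, q):
        # Symmetrized, always-finite divergence via the midpoint m = (p + q) / 2
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        m = 0.5 * (p + q)
        return 0.5 * kullback_leibler(p, m) + 0.5 * kullback_leibler(q, m)

    def onicescu_energy(p):
        # Onicescu's information energy E = sum_i p_i^2
        # (discrete analogue of the integral of the squared density)
        p = np.asarray(p, dtype=float)
        return np.sum(p ** 2)

    p = np.array([0.5, 0.3, 0.2])
    q = np.array([0.4, 0.4, 0.2])
    print(shannon_entropy(p), kullback_leibler(p, q),
          jensen_shannon(p, q), onicescu_energy(p))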
Similar resources

Information Distances versus Entropy Metric
Information distance has become an important tool in a wide variety of applications. Various types of information distance have been proposed over the years. These information distance measures differ from the entropy metric: the former are based on Kolmogorov complexity and the latter on Shannon entropy. However, for any computable probability distributions, up to a constant, the expected value...
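The Kolmogorov-complexity-based information distance, max{K(x|y), K(y|x)}, is not computable, but its Shannon counterpart, the entropy metric d(X, Y) = H(X|Y) + H(Y|X), is. A hedged Python sketch, assuming a small joint distribution given as a 2-D array (an illustrative toy, not the paper's construction):

    import numpy as np

    def entropy_metric(joint):
        # d(X, Y) = H(X|Y) + H(Y|X) = 2 H(X, Y) - H(X) - H(Y), in bits
        joint = np.asarray(joint, dtype=float)

        def H(p):
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        px = joint.sum(axis=1)  # marginal distribution of X
        py = joint.sum(axis=0)  # marginal distribution of Y
        return 2 * H(joint.ravel()) - H(px) - H(py)

    # X and Y agree 80% of the time; the metric is 0 only when
    # each variable fully determines the other
    joint = np.array([[0.4, 0.1],
                      [0.1, 0.4]])
    print(entropy_metric(joint))  # about 1.44 bits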
Entropy, complexity, and spatial information
We pose the central problem of defining a measure of complexity, specifically for spatial systems in general and city systems in particular. The measures we adopt are based on Shannon's (Bell Syst Tech J 27:379-423, 623-656, 1948) definition of information. We introduce this measure and argue that increasing information is equivalent to increasing complexity, and we show that for spatial distributions...
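As a toy illustration of Shannon information applied to a spatial system, a short Python sketch with hypothetical population counts over four city zones (the zone data are invented for the example):

    import numpy as np

    def spatial_entropy(counts):
        # Shannon entropy (bits) of the distribution implied by raw counts per zone
        counts = np.asarray(counts, dtype=float)
        p = counts / counts.sum()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    even = [250, 250, 250, 250]     # population spread evenly across 4 zones
    clustered = [700, 200, 50, 50]  # population concentrated in one zone
    print(spatial_entropy(even))       # 2.0 bits, the maximum for 4 zones
    print(spatial_entropy(clustered))  # about 1.26 bits, a more ordered pattern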
Evolution of Information and Complexity in an Ever-Expanding Universe
Using the usual definitions of information and entropy in quantum gravity and statistical mechanics, and the existing views about the relation between information and complexity, we examine the evolution of complexity in an ever-expanding universe.
Entropy, Negentropy, and Information
Evaluation criteria for different versions of the same database. The concept of information, over the course of its development, has been connected to the concept of entropy created by 19th-century thermodynamics scholars. Information means, in this viewpoint, order, or negentropy. Entropy, on the other hand, is connected to concepts such as chaos and noise, which in turn cause disorder. In the present paper, ...
Journal

Journal title: The Journal of Chemical Physics
Year: 2005
ISSN: 0021-9606, 1089-7690
DOI: 10.1063/1.2121610